Interactive E-Government: Evaluating the Web Site of the UK Inland Revenue
Authors
Abstract
As government organizations have begun increasingly to communicate and interact with citizens via the Web, providing services has demanded an acute understanding of the requirements of users and appropriate tailoring of solutions. In this paper, we examine the results of a survey of the quality of a Web site provided by the UK Government: that of the Inland Revenue. The survey was administered directly after the launch of a new system enabling online submission of self-assessed tax returns. The instrument, E-Qual, draws on previous work in Web site usability, information quality, and service interaction quality to provide a rounded framework for assessing e-government offerings. The metrics and qualitative comments provide detailed insights into the perceptions of users who attempted to interact with the online taxation system. The results point to specific areas of the Web site in need of development, which are found to be consistent with initiatives launched recently by the Inland Revenue.

INTRODUCTION

The implications of Web-based services have now moved well beyond e-commerce and are being felt in many other areas of organisational life. One such area is electronic (e-) government. Since the late 1990s, substantial government services have been provided via the Web in countries such as the US, UK, New Zealand, Australia, Portugal, Italy, Malaysia and Singapore. Digital government has huge potential benefits. Government transcends all sectors of society: it not only provides the legal, political, and economic infrastructure to support other sectors, but also exerts considerable influence on the social factors that contribute to their development (Elmagarmid, 2001). E-government thus has the potential to profoundly transform people's perceptions of civil and political interactions with their governments.
Even though we may see further convergence of e-commerce and e-government services (Kubicek and Hagen, 2001), unlike e-commerce, e-government services must – in most societies – be accessible to all. Through the Web, expectations of the service levels that e-government sites must provide have been raised considerably (Cook, 2000). This research utilizes the E-Qual method (previously called WebQual) to assess the quality of a specific national Web site provided by the UK Government. The Web site is that of the Inland Revenue – a site relating to UK tax policy and administration. E-Qual was developed originally as an instrument for assessing user perceptions of the quality of e-commerce Web sites. The instrument has been under development since the early part of 1998 and has evolved via a process of iterative refinement in different e-commerce and e-government domains (e.g. see Barnes and Vidgen, 2001a; 2001b; 2002). Most recently, the instrument has been used in areas of UK, New Zealand and cross-national Government. The method turns qualitative customer assessments into quantitative metrics that are useful for management decision-making. Typically, the tool allows comparisons to be made for the same organization over time or between organizations in an industry. The Web services examined in this research include transaction-based interaction via the submission of self-assessed tax returns. While e-government can provide communication, transaction and integration of administrative services, many countries are not making extensive use of the Web. A study by the Cyberspace Policy Research Group (CyPRG) suggests that the 1999 global average score for information transparency is less than 50% and for interactivity it is less than 25% (La Porte et al., 2001). 
Although there appears to be less progress with transaction-based services, a Gartner Research (2001) survey of European countries showed that citizens' demand for information massively outweighs their demand for interactivity. The research reported here focuses on this important issue, drawing specific attention to the perceptions of users of an interactive e-government Web site. The paper also compares the conclusions of this research with the developments made by the Inland Revenue (independently of this study) and launched in an enhanced Web site in the second half of 2003. The structure of the paper is as follows. In the next section we describe the background to the research and the methodology used. Sections three and four report the quantitative and qualitative findings respectively, which are then discussed and interpreted in section five. Conclusions are drawn in the last section.

RESEARCH CONTEXT AND METHODOLOGY USED

In this section, we provide some background to the study outlined in this paper and an explanation of the methodology adopted for evaluating the e-government Web site of the UK Inland Revenue.

Background to the Research Project

A project to evaluate the quality of the UK Inland Revenue Web site (http://www.ir.gov.uk/) was initiated by the Tax Management Research Network (TMRnet) in the early part of 2001 and carried out with the support and co-operation of the Inland Revenue. TMRnet is a network of academic researchers and tax practitioners, launched in 2000, that undertakes joint research on the interface between national tax policy and the practical management of national tax regimes. The aim is to further understanding of how to manage taxation departments: that is, to improve tax administration and customer service, including the exploration of opportunities afforded by new technologies such as the Internet.
The TMRnet Steering Group includes academics from the Universities of Bath and Nottingham, senior members of the UK Inland Revenue and H.M. Customs and Excise, and members of the Fiscal Affairs Division of the OECD. Aside from information provision, a major feature of the Inland Revenue's Web site is its self-assessment facility for tax returns, first used for the 1999 to 2000 financial year to submit returns by 5 April 2001. The site thus provides a high degree of interactivity and the possibility for transactions. The online self-assessment facility is a major part of the Inland Revenue's £200 million e-strategy (HMSO, 2001), aimed at delivering fifty per cent of services electronically by 31 December 2002. In addition, the long-term aims are to provide all services electronically by 31 December 2005, by which time the take-up of services should be 50 per cent. The proposed benefits for taxpayers of using the Self Assessment service are accuracy, convenience, confirmation of submission, and faster processing of any tax refunds (HMSO, 2002). Whilst it is difficult to predict the achievable savings with confidence, the department estimates that when take-up reaches 50 per cent across all activities, this might enable efficiency savings equivalent to some 1,300 posts. The evaluation of the IR Web site was undertaken using the E-Qual instrument, developed at the University of Bath, and was carried out during the period 1 August through 30 September 2001. In this report we present the results of the evaluation of the IR Web site using quantitative results produced through analysis of the E-Qual data. The quantitative analysis is supplemented by the qualitative comments of respondents to provide triangulation of the results and a deeper insight into user attitudes. Information is also included about subsequent development of the Web site by the Inland Revenue in order to contextualise the findings and to provide a comparison with actual developments.
Previous Experiences with the Evaluation Instrument

A review of the literature on Web site evaluation revealed no comprehensive instruments aimed specifically at e-government Web services. Therefore, and at the request of the Inland Revenue, we adopted the E-Qual instrument, adapting the format for interactive and non-interactive users. By adapting a previously developed and validated instrument, benefits accrue in the form of improved validity, the ability to compare results from previous studies with the current study, and a movement towards building a cumulative tradition of research (Straub and Carlson, 1989; Malhotra and Grover, 1998). E-Qual is based on quality function deployment (QFD), which is a "structured and disciplined process that provides a means to identify and carry the voice of the customer through each stage of product and or service development and implementation" (Slabey, 1990). Applications of QFD start with capturing the 'voice of the customer': the articulation of quality requirements using words that are meaningful to the customer. These qualities are then fed back to customers and form the basis of an evaluation of the quality of a product or service. E-Qual differs from studies that emphasise site characteristics or features (Kim and Eom, 2002), which are used in later stages of QFD. In the context of E-Qual, Web site users are asked to rate target sites against each of a range of qualities and to rate each of the qualities for importance. Although the qualities in E-Qual are designed to be subjective, a significant amount of quantitative data analysis is carried out, for example, to test the reliability of the E-Qual instrument. E-Qual has been under development since 1998 and has undergone numerous iterations. The development of E-Qual is discussed fully elsewhere (see Barnes and Vidgen, 2001a; 2001b; 2002; 2003).
E-Qual 4.0, as shown in Table 1, draws on research from three core areas:

• Information quality from mainstream IS research. A core part of the E-Qual instrument, from version 1.0, was the quality of online information. The questions developed in this segment of E-Qual build on literature focused on information, data and system quality, including Bailey and Pearson (1983), Strong et al. (1997) and Wang (1998).

• Interaction and service quality from marketing, e-commerce and IS service quality research. Bitner et al. (1990, p. 72) adopt Shostack's (1985) definition of a service encounter as "a period of time during which a consumer directly interacts with a service" and note that these interactions need not be interpersonal: a service encounter can occur without a human interaction element. Bitner et al. (1990) also recognize that "many times that interaction is the service from the customer's point of view" (p. 71). We suggest that interaction quality is just as important to the success of e-businesses as it is to "bricks and mortar" organizations (and possibly more so, given the removal of the interpersonal dimension). In version 2.0 of the instrument we therefore extended the interaction aspects by adapting and applying work on service quality, chiefly SERVQUAL (Parasuraman et al., 1985; 1988; Parasuraman, 1995; Zeithaml et al., 1990; 1993) and IS SERVQUAL (Pitt et al., 1995; 1997; Kettinger and Lee, 1997; Van Dyke et al., 1997).

• Usability from human-computer interaction. In E-Qual 4.0 the usability dimension draws on literature in the field of human-computer interaction (Davis, 1989; 1993; Nielsen, 1993) and, more recently, Web usability (Nielsen, 1999; 2000; Spool et al., 1999). Usability is concerned with the pragmatics of how a user perceives and interacts with a Web site: is it easy to navigate? Is the design appropriate to the type of site?
It is not, in the first instance, concerned with design principles such as the use of frames or the percentage of white space, although these are concerns for the Web site designer who is charged with improving usability. Notwithstanding, we have used quality workshops at every stage of E-Qual's development to ensure that the qualities remain relevant, particularly where they relate to pre-Internet literature and to new organisational or industrial settings, such as e-government. In addition to the applications of specific versions of E-Qual for electronic commerce in business-to-consumer and consumer-to-consumer settings (see Barnes and Vidgen, 2001a; 2001b; 2002), the instrument has also been used in several other e-government areas. Most recently the instrument has been used to evaluate the following:

• The Forum for Strategic Management Knowledge Exchange (FSMKE), a site relating to international tax policy and administration provided by the Organisation for Economic Cooperation and Development (OECD). The FSMKE Web site was first evaluated in April to May 2001 and then, following a Web site redesign exercise, the new site was re-evaluated in the period July to September 2001. In our sample we collected data from a variety of FSMKE members, including the UK, Australia, Canada, Japan and The Netherlands. The multi-stakeholder analysis of the Web site redesign in this case helped to enhance understanding of how quality is perceived differently among different groups, rather than treating all site users as a homogeneous group. The perspectives of the range of international members emphasize the importance of fully understanding how different users interact with the site when attempting redevelopment; an improvement for one group might be perceived as a lessening in quality for another group.

• The Alcohol Advisory Council (ALAC) of New Zealand and Alcohol Concern in the UK.
ALAC is a Government-funded, Crown-owned entity whose primary objective is "to promote moderation in the use of alcohol and to develop and promote strategies that will reduce alcohol related problems for the nation". Alcohol Concern is a registered charity, partly funded by Government, and the national voluntary agency on alcohol misuse. It plays a key role in promoting and advising on the development of national alcohol policy and in promoting public awareness of alcohol issues. The ALAC site was benchmarked against its UK equivalent, Alcohol Concern, to provide a comparison and to give insight into potential differences in perceptions of Web site quality associated with cultural factors. The results show that the New Zealand respondents rated the ALAC site considerably higher than its UK counterpart, Alcohol Concern, with overall WebQual indices of 71% and 61% respectively. By contrast, the UK respondents viewed the Alcohol Concern site more favourably in relative terms, with WebQual indices of 70% (Alcohol Concern) and 68% (ALAC). The findings lend some early evidence for culturally based differences of perception, with associated implications for Web site design in organizations operating in more than one geographic region.

Design of the Evaluation

The standard E-Qual instrument, previously called WebQual, contains 23 questions (Barnes and Vidgen, 2002). These are shown in Table 1. Three of the questions relate to personal information and making transactions:

• Question 17: It feels safe to complete transactions
• Question 18: My personal information feels secure
• Question 22: I feel confident that goods/services will be delivered as promised

These three questions are relevant to respondents using the self-assessment facilities of the IR Web site but not to those who are using the site for information gathering purposes only.
By self-assessment, we are referring to the online submission of tax returns that have been processed by the taxpayer using the self-assessment guidelines. The interaction questions were qualified with the instruction to "please tick n/a if you have not used the Internet service for self-assessment or the Internet service for PAYE". This allows the data set to be divided between "information gatherers" and "interactors". The survey of Web site quality for the IR was conducted using an Internet-based questionnaire. The home page of the questionnaire had instructions and guidelines for completion of the instrument. From the home page the user opens a separate window (control panel) containing the Web site qualities to be assessed. The control panel allows the user to switch the contents of the target window between the instruction page, the IR Web site, and the quality dictionary. The online quality dictionary is linked to the question number, allowing the respondent to get a definition for any particular quality. Users were asked to rate the IR site for each quality using a scale ranging from 1 (strongly disagree) to 7 (strongly agree). Users were also asked to rate the importance of each quality to them, again using a scale from 1 (least important) to 7 (most important). Open comments were encouraged, and a remarkably high proportion of respondents, 65%, made the effort to provide an additional comment on the site.

INSERT TABLE 1 ABOUT HERE

The evaluation resulted in 420 usable responses. Demographic and other respondent information is shown in Table 2. The respondents were typically highly experienced and intensive users of the Internet, although not intensive users of the IR Web site. The majority of respondents were male (71%) and of working age. 10% use the IR site daily. Agents and accountants comprised 15.5% of respondents, while 60% categorized themselves as "other".
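The division of the data set between "interactors" and "information gatherers" described above can be sketched as a simple partition of the response records on the three interaction questions. The record structure and field values here are illustrative assumptions, not the actual survey database.

```python
# Partition respondents by whether they answered the interaction
# questions (17, 18 and 22) or ticked n/a, as described above.
# Records here are hypothetical: question number -> rating, or None for n/a.

INTERACTION_QS = (17, 18, 22)

def split_respondents(responses):
    """Return (interactors, information_gatherers)."""
    interactors, gatherers = [], []
    for r in responses:
        # A respondent counts as an interactor only if all three
        # interaction questions were actually rated.
        if all(r.get(q) is not None for q in INTERACTION_QS):
            interactors.append(r)
        else:
            gatherers.append(r)
    return interactors, gatherers

sample = [
    {17: 5, 18: 4, 22: 5, 1: 6},            # rated all interaction questions
    {17: None, 18: None, 22: None, 1: 7},   # ticked n/a
]
interact, no_int = split_respondents(sample)
```

Under this sketch, the two groups can then be analysed separately, as in Tables 3 and 4 of the paper.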
INSERT TABLE 2 ABOUT HERE

ANALYSIS AND DISCUSSION OF QUANTITATIVE RESULTS

This section reports on the results of the survey, using a variety of statistical methods for data analysis.

Discussion of Summary Data

The data were analysed according to the degree of interaction of the user. The questionnaire asked respondents to answer questions 17, 18 and 22 only if they had had full interaction with the site, such as online submission of a tax return. The data collected are summarized in Tables 3 and 4. Note that at this stage, we have not presented any categories for the questions (this is discussed below). The importance scores give the average importance ranking for each question, for each group ('interact' refers to those who answered questions 17, 18 and 22, where n = 264, and 'no int.' to those who did not, where n = 156), based on all of the responses. In addition, the per-question average scores for each of the classifications ('interact' and 'do not interact') are given, along with the standard error of the mean.

INSERT TABLE 3 ABOUT HERE

Interactive users

Referring to Table 3, we see some interesting patterns in the data. In terms of the importance ratings of particular questions, there are some useful groupings to note. Overall, those questions considered most important, i.e., above the upper quartile of 6.10, all concern ease of use, safety of personal information, and accurate, trusted and pertinent content. Here we find, in descending order of importance, questions 9, 10, 4, 13, 18 and 12. At the other end of the spectrum, those questions considered least important, i.e., below the overall lower quartile of 5.36, are based around reputation and the "look and feel" of the site in terms of user empathy and site design: specifically, questions 20, 5, 19, 6, 16 and 8, in ascending order of importance. Other questions fall in between, and the median is 5.93. The results suggest that there are specific priorities in the qualities demanded from the IR Web site by users.
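The quartile-based grouping used above can be sketched as follows. The mean importance figures in this example are invented for illustration (the paper's own cut-offs for the interactive group are an upper quartile of 6.10 and a lower quartile of 5.36).

```python
# Classify questions as most/least important using quartiles of the
# mean importance ratings, as in the analysis above. Values are
# illustrative, not the actual survey means.
import statistics

importance = {
    4: 6.3, 5: 5.1, 9: 6.5, 10: 6.4, 12: 6.1, 13: 6.2,
    16: 5.2, 18: 6.2, 19: 5.0, 20: 4.9,
}

# statistics.quantiles returns the three quartile cut points.
q1, _, q3 = statistics.quantiles(importance.values(), n=4)

# Most important: at or above the upper quartile, descending order.
most_important = sorted((q for q, v in importance.items() if v >= q3),
                        key=lambda q: -importance[q])
# Least important: at or below the lower quartile, ascending order.
least_important = sorted((q for q, v in importance.items() if v <= q1),
                         key=lambda q: importance[q])
```

With these made-up means, questions 9 and 10 fall above the upper quartile and questions 20 and 19 below the lower one, mirroring the shape of the groupings reported in the text.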
Getting easy access to 'good' information appears paramount, whilst certain other aspects that may be important for some commercial sites, such as design aesthetics and building a networked community experience for users to return to, are not so important. Interestingly, reputation is not considered important, presumably because the Inland Revenue is a known Government body.

Non-interactive users

For the users who did not interact fully with the IR Web site (questions 17, 18, and 22 are excluded), the resultant data on site quality yield some similar importance rankings. However, there were some changes in priorities for questions. In the upper quartile range (6.39 and above), question 11 (timely information) replaces the missing question 18, placing yet greater emphasis on information quality. At the other end of the scale, the questions below the lower quartile (5.28) are identical, although the ordering is slightly different, i.e., the least important questions again refer to soft issues of empathy and aesthetics. The picture for interactive and non-interactive users is therefore remarkably similar, with the key difference relating to the security of personal information (question 18) for interactive users.

Weighted Scores and the E-Qual Index

The unweighted scores in Table 3 give an idea of the strengths and weaknesses of the IR site, as perceived by the respondents in raw terms. Weighted results serve to accentuate these differences in the direction of user priorities. These are shown in Table 4. One key aim of this approach is to achieve an overall quality rating for the Web site so that we can benchmark the perceptions of site users. The total scores make it difficult to give a standard benchmark for the Web site, especially since questions 17, 18 and 22 are omitted from the responses of non-interactive users. One way to achieve this is to index the total weighted score for each site against the total possible score in that time period (i.e.
the total importance for all questions answered multiplied by 7, the maximum rating for a site). The result is expressed as a percentage. A summary of these calculations and totals is given in Table 4.

INSERT TABLE 4 ABOUT HERE

Overall, we can see quite clearly that the interactive users benchmarked well below the non-interactive users (62% and 72% respectively), a difference of 10 points in the E-Qual Index (EQI). Even more remarkable is that the evaluations of interactive users rated consistently below those of non-interactive users for all questions, with differences ranging from 1 to 18 points. The major areas of difference between interactive and non-interactive users are shown in Table 5. The largest differences relate to usability (items 1, 4, 2, 3), followed by competency and understandable information. To see the bigger picture it is useful to assess how perceptions of quality differ. To this end, the next section uses reliable sub-groupings obtained from previous applications of E-Qual and applies them to the analysis of the IR data set.

INSERT TABLE 5 ABOUT HERE

Analysing the Differences in Perceptions

The data indicate differences in perceptions of E-Qual site quality. Here we examine where these perceived differences have occurred and consider the overall shape of the evaluation of the IR site. Previous research for E-Qual has led to a number of valid and reliable question subgroupings (Barnes and Vidgen, 2002). Briefly, they can be explained as follows:

• Usability (questions 1 to 8). Qualities associated with site design and usability; for example, appearance, ease of use and navigation, and the image conveyed to the user. Usability and Design provide two subcategories in the data.

• Information quality (questions 9 to 15). The quality of the content of the site: the suitability of the information for the user's purposes, e.g. accuracy, format, and relevancy.

• Service quality (questions 16 to 22).
The quality of the service interaction experienced by users as they delve deeper into the site, embodied by the subcategories Trust and Empathy, including items such as reputation, security, personalization and communication with the site owner.

These categories provide some useful criteria by which to assess the perceptions of site users. Using the question groupings, we can build a profile of a user group that is easily compared to others. We are now in a position to examine the considerable differences in perceptions of interactive and non-interactive users on the E-Qual Index. As a starting point, the data were summarised around the questionnaire subcategories. Then, similarly to the E-Qual Index in Table 4, the total score for each category was indexed against the maximum score (based on the importance ratings for questions multiplied by 7). Figure 1 is the result, which rates the two sets of users against these criteria. Note that the trust category is limited to question 16 for the users who 'do not interact'. Further, the scale has been adjusted to between 40% and 80% to allow for clearer comparison. Clearly the users who do not interact with the site have higher perceptions in all respects, although the general pattern of site ratings is similar for all users. In absolute terms, for users who 'do not interact' all site categories rate quite highly, at between 72% and 77%, except for empathy (52%). Although this category also rates lowest in importance, it does indicate an opportunity for building relationships with users. For 'interactive' users, empathy, usability and design rate lowest (at 49%, 56% and 61% respectively), with information (68%) and trust (66%) the best rated scores.

INSERT FIGURE 1 ABOUT HERE

Figure 1 demonstrates that the biggest subcategory differences in perceptions are in usability and design – 16% and 11% respectively. Close behind is information quality – at 9%.
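The weighted-score indexing used for the overall EQI (Table 4) and the subcategory profiles (Figure 1) can be sketched as below. The category-to-question mapping follows the subgroupings listed above; the ratings and importance weights are invented for illustration only.

```python
# Weighted-score indexing as described above: sum(rating x importance)
# divided by the maximum possible score sum(importance x 7), as a
# percentage. Example figures are illustrative, not the survey data.

CATEGORIES = {
    "usability": range(1, 9),      # questions 1-8
    "information": range(9, 16),   # questions 9-15
    "service": range(16, 23),      # questions 16-22 (trust and empathy)
}

def index_score(questions, ratings, importance):
    answered = [q for q in questions if q in ratings]
    if not answered:
        return None
    weighted = sum(ratings[q] * importance[q] for q in answered)
    maximum = sum(importance[q] * 7 for q in answered)
    return 100.0 * weighted / maximum

def profile(ratings, importance):
    """Overall EQI plus one index per subcategory. Questions a group did
    not answer (e.g. 17, 18 and 22 for non-interactive users) are simply
    absent from `ratings` and drop out of numerator and maximum alike."""
    result = {"EQI": index_score(list(ratings), ratings, importance)}
    for name, questions in CATEGORIES.items():
        result[name] = index_score(questions, ratings, importance)
    return result

# Tiny illustrative group with one answered question per category.
example = profile({1: 4.0, 9: 5.5, 16: 4.5},
                  {1: 6.0, 9: 6.5, 16: 5.0})
```

Because omitted questions drop out of both the weighted total and the maximum, the index remains comparable across the 'interact' and 'do not interact' groups, which is the point of the benchmarking approach described above.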
The most similar quality perceptions were for empathy – a difference of just 3%. Apparently, interaction with the Inland Revenue site severely affects perceptions of usability and design (as identified in Table 5). This finding is explored further in the comments of site users below.

QUALITATIVE RESULTS

At an interpretive level, many of the features drawn out in the quantitative findings are supported in the qualitative data drawn from the open comments of respondents. This also adds richness and helps to explain the "why?" behind some of the patterns in the quantitative data. As indicated in Figure 1, numerous areas of the Inland Revenue Web site were open to criticism, particularly by interactive users. Out of the 420 responses received there were 274 comments, representing 65% of respondents. Below we present some of the user comments. These are largely in an unadulterated format to give a richer, qualitative context to the E-Qual survey, although they have been grouped into pertinent areas of response. In terms of site design and usability, difficult navigation, links and password access appeared to be the most common complaints, and there was a very large volume of feedback in this area. Comments included (reproduced verbatim):

Getting to the PAYE [pay as you earn] Self Assessment forms is not easy can't we have one BIG button? I have to trawl through pages of stuff I'm not interested in to find it or am I blind? Can't I have a "go back to your form log in" button like I do with travel sites? Oh, and I'm still waiting for my ID weeks after registering I've mailed the help desk. An opportunity lost!

This is the least intuitive website I have ever visited. One can go around in circles for days! You have to be very lucky to find and submit your Self Assessment forms on-line!!

On page cto/pa6 it says "if you know what forms you need, click here" and you then get taken only to form IHT 200. I knew I wanted IHT 205 and 206. But it took me hours to get them.
And it's very irritating we can't send you an e-mail without coming to this point!!

Link to the DMG (tax credits manual) is broken no easy way to report this and very, very annoying.

Links often missing, even when found on your own search, content and structure seems to change on a random basis!!

This site is very unhelpful when it comes to the most important parts of my entry: my UTN [UTR: unique tax reference] and what the passwords were. I used this system last year and it was fine. This year it is a total shambles, as is the IR when it comes to informing people of changes to the logging in system. I have tried 3 times without success to register my return and am now in the situation that I cannot do so before the deadline.

I only wish to access the site to submit my self-assessment. I have already started to compile the return on the IR site but now cannot find it. There is the Government Gateway that I can't remember seeing before. I can't log in with the user ID I have used just some months ago.

I don't find the site easy to navigate at all. In fact I would find filling in the form on paper both quicker and easier I think. I had to phone up for the UTR [unique tax reference] as it was not printed on the demand sent to fill in my tax return. I think I am looking for a button that simple says 'Fill in/submit your tax return'.

Sorry to be so negative about the web site. I find it hard to locate specific information and have limited success with the search engine which, after a long wait, often returns an 'unavailable page' which doesn't really tell me if there are no matches or if there is a problem with the site.

Positive responses to site design and navigation tended to be associated with those who had not attempted sophisticated interaction such as submission of self-assessment forms. There were far fewer of these responses, largely because open comments typically came from those who had had a poor experience of interaction.
Comments included:

Thank you for having such a wonderful web site where I was able to come and get some much-needed information regarding SMP. My company is located in the USA and this is a new experience for me.

A good site nice and easy to navigate and to find what you want.

Very Useful in the short time I have used it. Particularly useful for downloading forms and I am sure that when the filing of SA [self-assessment] documents by agents becomes operational will prove very popular.

The content provided was generally considered to be of high quality. Most of the critical comments regarded the need for further information, or greater detail:

The web site is good, giving easy access to important information. I'd like to more information and forms being made available and also you may possibly consider putting some tax education facility on the web so that the interested amateur may learn and also so that students may learn. Let me end by saying THANK YOU for all the hard work that goes into this site. I appreciate it.

Overall reasonably good site could be more detailed in some areas dealing with Employers matters.

Overall a good site but a pain not having all the leaflets available in the PDF format. Having the manuals in this would be good also, as having each section on a different page is very poor. It makes it more difficult to look for information.

Generally good, but I couldn't find detailed information on transfer of principal private residence and letting CGT [capital gains tax] reliefs on transfer of a property between husband and wife.

Several comments criticised the quality of information, including its accuracy, currency, ease of understanding, and format:

The site is next to useless. One has to know Revenue terminology to stand even half a chance of finding relevant information.

I found a couple of inflexibilities & some inaccuracies (over treatment of pension relief) in online SA Form.
Also form does not allow enough space for appending comments (255 word limit).

It's excellent to have a definitive reference for tax info. The online forms are excellent although the PDF format has dire usability.

Information needs updating, for instance, rates from April 1999 are no use, currents rates needs to be shown and updated. I believe that emailing enquiries to the local tax office would be useful and less time consuming.

Frustrated by having to search for so long for information. Also that most of the PDF files have last year's figures.

It would be most convenient if leaflets could be ordered centrally through the website (for business). The site looks good and works quickly. Perhaps it's just the nature of the business that makes the experience frustrating.

Have now written to MP [member of parliament] over lack of response to complaint of factually incorrect information about "requirement" for Windows operating system to use SA Return service.

Basically I can't easily ever find what I'm looking for, search engine hopeless, can't easily get to what I want to know. HATE PDF so slow and often causes problems.

The Web site should be updated for changes to telephone numbers etc.

I have 21" monitor and I could not get the text to the size needed to for me to see it get it sorted please.

However, there were also numerous comments from users praising the informative nature of the site:

Probably the most useful and informative site I visit on a regular basis.

Excellent, informative web site.

This site is very informative; I find out more from here than I do from my local IR. Thank you.
"Charity sector good to find plenty of info in downloadable PDF format."

"Easily found info on IT allowances that I required."

"Thank you for having such a wonderful web site where I was able to come and get some much needed information regarding SMP. My company is located in the USA and this is a new experience for me."

From a service perspective, one of the key problems appears to be communicating with the organisation:

"Apart from this survey I can't find any way of contacting the IR via e-mail. All I want to do is to ask a simple question, i.e. are there any circumstances under which the allowance restriction is not deducted from a person's personal allowance. Please reply as soon as possible to the above address. P.S. It would be helpful if you had an e-mail address on your web page."

"1) Why doesn't the IR deal with queries on line? I appreciate that this is not going to be suitable for all detail queries but you could answer simple questions that don't require detail of an individual's case. Most web sites have this facility now. The reason that I am completing this form is really to communicate a specific point, see below. 2) You need to make sure that your Internet site is consistent with your other communications. You've just sent me my Tax Calculation and the accompanying document SA354 said that I could find details of how to pay my tax due at: www.inlandrevenue.gov.uk/howtopay/selfassessment.htm. I used this address and got a statement saying that the page no longer exists (and no reference to where I could now find the information). It's unacceptable customer service to tell me where I can find information if you don't actually have the information there. I have gone through the route that you communicated and have not got the information that I need."

"Requires e-mail facility to get more detailed answers; other than that a good, helpful site."

"Have you considered offering a mailing list to advise of updates to the site? I'd be particularly interested in one for the Pension Schemes Office."

"More communication links required. All tax offices should have email links."

The other key customer service issues centre on the ability of the IR site to provide what is promised or expected (partly linked to the difficulty in submission of returns), and the ability of users to receive a personal service (partly linked to the communication issue above):

"Has not resolved the issue or my reason for visiting your page."

"As a complete beginner to taxation, I came to this site expecting to be able to find some kind of 'beginner's guide to tax', a simple explanation of taxation rules that will help me deal with my own budget during my first few months in employment. I would like to see an easy comparison chart showing how much tax I should pay for an income of £xxx pounds and then have NI contributions split out. I know the are a lot of different factors but the ability to get quick easy and basic information will allow me to make sound judgement as to if I feel that I have been over or under charged tax contributions and decide to subsequently get in touch with IR."

"Why can't we just ask questions by e-mail? I have an important but non time-critical question and the answer cannot be found on the site. Why can't I just send an email? It's a generic question with no personalisation component (Is the Industrial CASE portion of the income of a PhD research student taxable?), and would be ideal for email response. Site needs this kind of better interaction if you really want it to feel like 'community'."

DISCUSSION

Overall, the qualitative data provides an interesting triangulation with the quantitative results of the E-Qual survey. In particular, it helps to explain why there were such radically different perceptions among users who attempted online submission of self-assessed tax returns ("interactors") and those who were largely concerned with finding certain information ("information seekers").
Typically, the quality of the user experience was significantly lower for those users who attempted deeper site interaction. Specifically, the survey showed that users who accessed the site for information-gathering purposes were significantly more satisfied (EQI = 72%) with the service than those who attempted to interact through self-assessment (EQI = 62%). The major areas of difference relate to usability, navigation, understandable information and communication. The conclusion we draw from this is that the self-assessment interaction damages respondents' perceptions of the IR Web site. This downgrading of user-perceived quality relates to all aspects, including trust and information quality. Based on the qualitative data, the two key problems determining the differences in perceptions between information seekers and interactors appear to be the following:

• Usability of the online self-assessment facility. Open comments supplied by respondents suggest that the self-assessment interaction is complicated by the need to leave the IR site to register for a user ID at the Government Gateway. Delivery of a password by post creates a delay that can be compounded by the user not being able to locate their unique tax reference (UTR), possibly requiring a further telephone call. Once the user has registered, it is not immediately apparent how to find the self-assessment forms on the site (hence the comment that a "big button" is needed on the home page to take the user directly to the forms). All in all, the respondents found the self-assessment process to be cumbersome.

• Communicating with the organisation. The second major area of concern was with contacting the IR electronically. Many respondents wished to email the IR with queries but could find no way of doing so via the Web site.
The IR do not enter into email correspondence due to concerns about security and privacy, a situation that is unlikely to be resolved until there is widespread adoption of digital certificates and a public key infrastructure. This issue is compounded by the problems experienced when users attempted online self-assessment. Users based overseas are particularly keen to communicate by email due to telephone costs and time zone differences. This lack of accessibility creates further frustration; some respondents resorted to emailing the E-Qual survey email address because it was the only email address they could find. Other areas of concern regarding the site included the accuracy and currency of information (and links), the availability of specific information or online facilities (such as an online tax calculation or ready reckoner), and the format of information (typically difficulties with the PDF format). However, these other areas were generally not specific to the use of the online self-assessment facility, being largely shared by interactive and non-interactive user groups.

SUMMARY AND CONCLUSIONS

This research has examined an important area of development for digital government: online taxation systems. It focuses on the experiences in the UK surrounding the introduction of an online facility for self-assessed tax returns and, specifically, on evaluating the quality of the associated Web site using E-Qual. E-Qual is a method for assessing the quality of an organization's electronic offering. The E-Qual Index gives an overall rating of a Web site that is based on user perceptions of quality weighted by importance. Within E-Qual, five factors are used: usability, design, information, trust, and empathy. The quantitative data is typically supplemented by qualitative comments from respondents.
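The importance-weighted calculation behind the E-Qual Index can be sketched as follows. This is a minimal illustration consistent with the maximum and weighted score columns reported in Table 4, not the authors' actual scoring code; the function name and the sample data are invented for illustration. The assumed scheme is that each respondent rates every questionnaire item twice on 1-7 Likert scales, once for importance and once for the site's performance.

```python
# Sketch of the E-Qual Index (EQI): importance-weighted ratings as a
# fraction of the maximum achievable score (a rating of 7 throughout).
# Names and data are illustrative, not taken from the paper's instrument.

def eqi(responses):
    """responses: iterable of (importance, rating) pairs, each on a 1-7 scale.
    Returns the E-Qual Index as a fraction of the maximum achievable."""
    weighted = sum(imp * rating for imp, rating in responses)   # Wgt. Score
    maximum = sum(imp * 7 for imp, _ in responses)              # Max. Score
    return weighted / maximum

# Illustrative ratings for the two user groups (three items each)
interactors = [(6, 4), (7, 3), (5, 4)]
info_seekers = [(6, 5), (7, 5), (5, 4)]

print(f"interactors: EQI = {eqi(interactors):.0%}")
print(f"information seekers: EQI = {eqi(info_seekers):.0%}")
```

On this reading, the overall indices reported for the two groups are obtained in the same way: the weighted scores for all items and respondents in a group are summed and divided by the summed maximum scores.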
In applying the method, we have found a distinct and consistent difference in the rating of the site between two user groups: information seekers and interactors. The latter group comprises those who attempted to engage in online self-assessed tax returns, and who typically rated the quality of the site much lower than those who merely sought information. The key problems affecting the perceptions of the interactive users are the usability of the self-assessment facility and difficulty in communicating with the organisation. The findings of this research demonstrate the early difficulties experienced by one government department in establishing an online, interactive e-government service. Such difficulties go well beyond those involved in establishing information transparency, which in most examples appears much easier to achieve (Barnes and Vidgen, 2003; La Porte et al., 2001). The core areas of difficulty in providing e-government services in this case study appear to be usability, especially in accessing and submitting a return, and empathy and personalisation, particularly in understanding the needs of the individual taxpayer, providing easy delivery of the required services, and providing a means for personal contact should the need arise. The same issues are also likely to be important as other government departments move towards electronic delivery of interactive services. The latest information from the Inland Revenue shows that there was a major increase in take-up in the year ended 5 April 2003, when 329,420 users accessed the system to submit their tax return. High take-up of the Inland Revenue's e-services depends on taxpayers finding some clear benefit for themselves in dealing with the Inland Revenue in that way. The benefits for taxpayers of using the Self Assessment internet service are an assurance that the return is arithmetically correct; convenience; confirmation that the return was received; and faster processing of any tax refunds (HMSO, 2002).
However, taxpayers expect further added value from completing their tax returns electronically. Internet users typically look for a time saving, such as a simplified form or being able to rely on the department completing many of the questions from existing data on the taxpayer's behalf. Take-up of the Internet service for Self Assessment will only improve significantly once online forms offer further added value to customers (HMSO, 2002). The research findings suggest that usability (rated only 56 per cent by interactive users) has been a major issue requiring attention. This finding is also borne out by indicative data on submission experiences. The proportion of successful first-time submission attempts averaged 44 per cent between April and September 2001, and improved further to an average of 70 per cent for the quarter ending December 2001 (HMSO, 2002). At one stage, the Revenue even asked customers to avoid using the service between 7 and 11 pm, when most people would want to use it, in an effort to help the system cope (Contractor UK, 2002). The second major finding is the need for empathy and personalisation in the delivery of services (empathy rated lowest of all categories, at only 49 per cent for interactive users). Recent developments at the Inland Revenue also support these findings: the Inland Revenue is currently moving from its existing arrangements for taxpayers to file a tax return towards a 'portal' environment offering secure personalised services, such as the option for taxpayers to view their account as well as the facility to file a tax return electronically. The new developments implemented by the Inland Revenue in the second half of 2003 support the findings about usability and the need for greater empathy and personalisation. This is especially evident in the redevelopment of the Inland Revenue's online tax return and the full integration of secure services within the site.
Although the Inland Revenue is constantly evaluating both its website and the services hosted, it will take time and resources to fully redesign existing services. Simplification of the form would require legislative change, and drawing on data stored elsewhere in the department to complete many of the questions on behalf of users would require new software to link existing computer systems (HMSO, 2002). The implementation of online taxation reported here is thus characterized as the automation of paper forms; to create an effective online taxation service, a business process redesign approach will be needed, which will involve the cost and pain typical of such transformational and cross-functional initiatives.

A considerable volume of both quantitative and rich qualitative data has been collected through the research programme outlined in the paper. This is the first in a series of studies of online e-government taxation focusing on the UK. Further research is planned to follow up this phase of data collection with additional phases. Specifically, we wish to conduct further detailed interviews with site users to ascertain in more detail their experiences with the site and perceptions of quality for online taxation systems.

REFERENCES

Bailey, J. E., and Pearson, S. W. (1983). Development of a tool for measuring and analyzing computer user satisfaction. Management Science, 29 (5), 530-44.
Barnes, S. J., and Vidgen, R. T. (2001a). An evaluation of cyber-bookshops: the WebQual method. International Journal of Electronic Commerce, 6 (1), 11-30.
Barnes, S. J., and Vidgen, R. T. (2001b). Assessing the quality of auction Web sites. Proceedings of the Hawaii International Conference on Systems Sciences, Maui, Hawaii, January 4-6.
Barnes, S. J., and Vidgen, R. T. (2002). An integrative approach to the assessment of e-commerce quality. Journal of Electronic Commerce Research, 3 (3), 114-127.
Barnes, S. J., and Vidgen, R. T. (2003). Measuring Web site quality improvements: a case study of the Forum on Strategic Management Knowledge Exchange. Industrial Management and Data Systems, in press.
Bitner, M. (1990). Evaluating service encounters: the effects of physical surroundings and employee responses. Journal of Marketing, 54, 69-82.
Contractor UK (2002). Revenue problems continue as self-assessment site fails to cope. http://www.contractoruk.co.uk/article687.shtml, accessed 31 December 2002.
Cook, M. E. (2000). What citizens want from e-government. Retrieved August 9, 2001, from: http://www.ctg.albany.edu/resources/htmlrpt/e-government/what_citizens_want.html
Davis, F. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13 (3), 340-51.
Davis, F. (1993). User acceptance of information technology: system characteristics, user perceptions, and behavioral impacts. International Journal of Man-Machine Studies, 38, 475-487.
Elmagarmid, A. K., and McIver, W. J. (2001). The ongoing march toward digital government. IEEE Computer, 34 (2), 32-8.
Gartner Research (2001). E-Government: What Are Citizens Really Looking For? Gartner Research, London.
HMSO (2001). Inland Revenue e-Strategy. HMSO, London.
HMSO (2002). e-Revenue. HMSO, London.
Jarvenpaa, S. L., Tractinsky, N., and Vitale, M. (2000). Consumer trust in an Internet store. Information Technology and Management, 1 (1), 45-71.
Kettinger, W., and Lee, C. (1997). Pragmatic perspectives on the measurement of information systems service quality. MIS Quarterly, 21, 223-240.
Kim, E. B., and Eom, S. B. (2002). Designing effective cyber store user interface. Industrial Management and Data Systems, 102 (5), 241-51.
Kubicek, H., and Hagen, M. (2001). Integrating e-commerce and e-government: the case of Bremen Online Services. In Prins, J., ed., Designing E-Government, Kluwer Law International, The Hague.
La Porte, T., Demchak, C., and Friis, C. (2001). Webbing governance: global trends across national-level public agencies. Communications of the ACM, 44 (1), 63-7.
Malhotra, M., and Grover, V. (1998). An assessment of survey research in POM: from construct to theory. Journal of Operations Management, 16 (4), 403-423.
Nielsen, J. (1993). Usability Engineering. Morgan Kaufmann, San Francisco.
Nielsen, J. (1999). User interface directions for the Web. Communications of the ACM, 42 (1), 65-72.
Nielsen, J. (2000). Designing Web Usability. New Riders Publishing, Indiana.
Parasuraman, A. (1995). Measuring and monitoring service quality. In Glynn, W., and Barnes, J., eds., Understanding Services Management, Wiley, Chichester.
Parasuraman, A., Zeithaml, V. A., and Berry, L. (1985). A conceptual model of service quality and its implications for future research. Journal of Marketing, 49, 41-50.
Parasuraman, A., Zeithaml, V. A., and Berry, L. (1988). SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality. Journal of Retailing, 64 (1), 12-40.
Pitt, L., Watson, R., and Kavan, C. (1995). Service quality: a measure of information systems effectiveness. MIS Quarterly, 19 (2), 173-87.
Pitt, L., Watson, R., and Kavan, C. (1997). Measuring information systems service quality: concerns for a complete canvas. MIS Quarterly, 21, 209-221.
Shostack, G. (1985). Planning the service encounter. In Czepiel, J., Solomon, M., and Surprenant, C., eds., The Service Encounter, Lexington Books, Lexington, MA.
Slabey, R. (1990). QFD: a basic primer. Excerpts from the implementation manual for the three-day QFD workshop. Transactions from the Second Symposium on Quality Function Deployment, Novi, Michigan, June 18-19.
Spool, J., Scanlon, T., Schroeder, W., Snyder, C., and DeAngelo, T. (1999). Web Site Usability: A Designer's Guide. Morgan Kaufmann, San Francisco.
Straub, D. W., and Carlson, C. L. (1989). Validating instruments in MIS research. MIS Quarterly, 13 (2), 147-169.
Strong, D., Lee, Y., and Wang, R. (1997). Data quality in context. Communications of the ACM, 40 (5), 103-10.
Van Dyke, T., Kappelman, L., and Prybutok, V. (1997). Measuring information systems service quality: concerns on the use of the SERVQUAL questionnaire. MIS Quarterly, 21, 195-208.
Wang, R. Y. (1998). A product perspective on Total Data Quality Management. Communications of the ACM, 41 (2), 58-65.
Zeithaml, V. A., Parasuraman, A., and Berry, L. (1990). Delivering Quality Service: Balancing Customer Perceptions and Expectations. The Free Press, New York.
Zeithaml, V. A., Berry, L., and Parasuraman, A. (1993). The nature and determinants of customer expectations of service. Journal of the Academy of Marketing Science, 21 (1), 1-12.

Table 1: The E-Qual Questionnaire

Usability
1. I find the site easy to learn to operate
2. My interaction with the site is clear and understandable
3. I find the site easy to navigate
4. I find the site easy to use
5. The site has an attractive appearance
6. The design is appropriate to the type of site
7. The site conveys a sense of competency
8. The site creates a positive experience for me

Information Quality
9. Provides accurate information
10. Provides believable information
11. Provides timely information
12. Provides relevant information
13. Provides easy to understand information
14. Provides information at the right level of detail
15. Presents the information in an appropriate format

Service Interaction
16. Has a good reputation
17. It feels safe to complete transactions
18. My personal information feels secure
19. Creates a sense of personalization
20. Conveys a sense of community
21. Makes it easy to communicate with the organization
22. I feel confident that goods/services will be delivered as promised

Overall
23. Overall view of the Web site

Table 2: Respondent demographics and experience (frequency, %)

Sector: Agent or accountant 15.5; Pensioner 3.7; Small employer 5.0; SME 12.3; Student 2.9; Other 60.6
Sex: Male 71.4; Female 28.6
Age: Under 26 11.1; 26 to 35 23.1; 36 to 45 28.5; 46 to 55 25.9; Over 55 11.4
Internet experience: Less than 6 months 1.8; 6 months to 1 year 3.6; 1 to 2 years 8.2; 2 to 3 years 18.6; More than 3 years 67.9
Internet usage: Less than once a month 0.3; Once a month 0.3; Once a fortnight 0.3; Once a week 11.5; Two or three times a week 2.6; Once a day 11.8; More than once a day 73.7
IR usage: Less than every three months 39.4; Once every three months 11.4; Once a month 13.6; Once a week 13.6; Two or three times a week 12.0; Daily 10.1

Table 3: Summary of the data: mean, standard error and standard deviation

No. | Description | Importance (Interact): Mean, St. Err., St. Dev. | Interact: Mean, St. Err., St. Dev. | Importance (No Int.): Mean, St. Err., St. Dev. | Do Not Interact: Mean, St. Err., St. Dev.
1 | I find the site easy to learn to operate | 6.02, 0.10, 1.60 | 3.87, 0.14, 2.24 | 6.11, 0.09, 1.10 | 5.05, 0.13, 1.66
2 | My interaction with the site is clear and understandable | 5.99, 0.10, 1.58 | 3.88, 0.13, 2.11 | 5.99, 0.09, 1.08 | 4.91, 0.13, 1.60
3 | I find the site easy to navigate | 6.07, 0.09, 1.53 | 3.85, 0.14, 2.23 | 6.26, 0.08, 0.97 | 4.76, 0.14, 1.76
4 | I find the site easy to use | 6.15, 0.09, 1.50 | 3.84, 0.14, 2.33 | 6.27, 0.08, 0.95 | 4.91, 0.14, 1.72
5 | The site has an attractive appearance | 4.35, 0.11, 1.82 | 4.36, 0.11, 1.74 | 4.07, 0.13, 1.54 | 4.66, 0.11, 1.33
6 | The design is appropriate to the type of site | 4.73, 0.11, 1.72 | 4.60, 0.11, 1.85 | 4.84, 0.13, 1.59 | 5.24, 0.12, 1.48
7 | The site conveys a sense of competency | 5.63, 0.10, 1.67 | 4.29, 0.14, 2.20 | 5.51, 0.12, 1.50 | 5.16, 0.12, 1.50
8 | The site creates a positive experience for me | 5.28, 0.11, 1.83 | 3.44, 0.14, 2.20 | 4.99, 0.14, 1.65 | 4.20, 0.14, 1.77
9 | Provides accurate information | 6.36, 0.08, 1.31 | 4.90, 0.13, 2.05 | 6.56, 0.09, 1.14 | 5.60, 0.12, 1.45
10 | Provides believable information | 6.19, 0.09, 1.43 | 5.15, 0.12, 1.95 | 6.61, 0.07, 0.84 | 5.92, 0.10, 1.17
11 | Provides timely information | 5.99, 0.10, 1.51 | 4.70, 0.13, 2.02 | 6.48, 0.08, 0.97 | 5.34, 0.13, 1.56
12 | Provides relevant information | 6.11, 0.09, 1.51 | 4.76, 0.13, 2.01 | 6.55, 0.08, 0.95 | 5.34, 0.14, 1.67
13 | Provides easy to understand information | 6.13, 0.09, 1.46 | 3.98, 0.13, 2.10 | 6.30, 0.09, 1.08 | 4.95, 0.13, 1.58
14 | Provides information at the right level of detail | 5.87, 0.09, 1.48 | 4.20, 0.12, 1.98 | 6.19, 0.09, 1.07 | 4.70, 0.13, 1.57
15 | Presents the information in an appropriate format | 5.69, 0.09, 1.47 | 4.52, 0.12, 1.96 | 5.82, 0.10, 1.22 | 5.09, 0.12, 1.42
16 | Has a good reputation | 5.24, 0.13, 1.91 | 3.97, 0.16, 2.12 | 5.34, 0.14, 1.60 | 4.93, 0.14, 1.48
17 | It feels safe to complete transactions | 6.04, 0.11, 1.64 | 4.90, 0.14, 2.07 | 6.42, 0.50, 1.73 | -
18 | My personal information feels secure | 6.12, 0.11, 1.58 | 5.02, 0.13, 1.97 | 6.45, 0.55, 1.81 | -
19 | Creates a sense of personalization | 4.55, 0.12, 1.87 | 3.32, 0.12, 1.90 | 3.70, 0.15, 1.62 | 3.20, 0.15, 1.61
20 | Conveys a sense of community | 3.73, 0.13, 2.00 | 2.93, 0.11, 1.75 | 3.01, 0.15, 1.69 | 2.87, 0.14, 1.57
21 | Makes it easy to communicate with the organization | 5.59, 0.11, 1.70 | 3.45, 0.13, 2.04 | 5.36, 0.14, 1.66 | 3.69, 0.15, 1.77
22 | I feel confident that goods/services will be delivered as promised | 5.86, 0.11, 1.66 | 3.87, 0.14, 2.20 | 6.22, 0.66, 1.99 | -
23 | Overall rating of the site | - | 3.79, 0.13, 2.17 | - | 5.09, 0.12, 1.46
Note: n=420; interactive users = 264; non-interactive users = 156

Table 4: Weighted scores and E-Qual indices – interactive and non-interactive users

No. | Description | Max. Score (I) | Wgt. Score | EQI1 | Max. Score (NI) | Wgt. Score | EQI2 | Difference (EQI2 - EQI1)
1 | I find the site easy to learn to operate | 42.14 | 23.41 | 56% | 42.80 | 31.46 | 74% | -18%
2 | My interaction with the site is clear and understandable | 41.92 | 23.61 | 56% | 41.95 | 30.08 | 72% | -15%
3 | I find the site easy to navigate | 42.51 | 23.77 | 56% | 43.82 | 30.48 | 70% | -14%
4 | I find the site easy to use | 43.06 | 23.89 | 55% | 43.91 | 31.41 | 72% | -16%
5 | The site has an attractive appearance | 30.43 | 19.56 | 64% | 28.52 | 19.79 | 69% | -5%
6 | The design is appropriate to the type of site | 33.12 | 22.65 | 68% | 33.86 | 26.55 | 78% | -10%
7 | The site conveys a sense of competency | 39.42 | 24.77 | 63% | 38.57 | 29.79 | 77% | -14%
8 | The site creates a positive experience for me | 36.98 | 18.87 | 51% | 34.95 | 22.08 | 63% | -12%
9 | Provides accurate information | 44.50 | 31.89 | 72% | 45.95 | 37.76 | 82% | -11%
10 | Provides believable information | 43.30 | 32.90 | 76% | 46.27 | 39.74 | 86% | -10%
11 | Provides timely information | 41.94 | 29.23 | 70% | 45.38 | 35.08 | 77% | -8%
12 | Provides relevant information | 42.79 | 30.24 | 71% | 45.88 | 35.86 | 78% | -7%
13 | Provides easy to understand information | 42.91 | 25.01 | 58% | 44.11 | 31.81 | 72% | -14%
14 | Provides information at the right level of detail | 41.08 | 25.42 | 62% | 43.33 | 29.65 | 68% | -7%
15 | Presents the information in an appropriate format | 39.84 | 26.39 | 66% | 40.71 | 30.45 | 75% | -9%
16 | Has a good reputation | 36.69 | 22.36 | 61% | 37.35 | 27.43 | 73% | -12%
17 | It feels safe to complete transactions | 42.26 | 30.67 | 73% | 44.92 | - | - | -
18 | My personal information feels secure | 42.81 | 31.90 | 75% | 45.18 | - | - | -
19 | Creates a sense of personalization | 31.84 | 16.00 | 50% | 25.91 | 13.31 | 51% | -1%
20 | Conveys a sense of community | 26.12 | 12.25 | 47% | 21.06 | 10.30 | 49% | -2%
21 | Makes it easy to communicate with the organization | 39.12 | 19.80 | 51% | 37.49 | 20.54 | 55% | -4%
22 | I feel confident that goods/services will be delivered as promised | 41.00 | 23.23 | 57% | 43.56 | - | - | -
TOTALS | | 865.76 | 537.81 | 62% | 875.49 | 533.59 | 72% | -10%
Note: n=420; interactive users = 264; non-interactive users = 156

Table 5: Differences between interactive and non-interactive users

Question | Difference
1. I find the site easy to learn to operate | -18%
4. I find the site easy to use | -16%
2. My interaction with the site is clear and understandable | -15%
3. I find the site easy to navigate | -14%
7. The site conveys a sense of competency | -14%
13. Provides easy to understand information | -14%

[Figure: E-Qual category scores; percentage axis 40%-80%; category labels include Usability]
Journal: IJEGR
Volume 3, Issue -
Pages: -
Published: 2004